@modelcontextprotocol/sdk
The Model Context Protocol allows applications to provide context for LLMs in a standardized way, separating the concerns of providing context from the actual LLM interaction. This TypeScript SDK implements the full MCP specification, making it easy to build MCP clients that can connect to any MCP server, create MCP servers that expose resources, prompts, and tools, use standard transports like stdio and SSE, and handle all MCP protocol messages and lifecycle events.
Install the SDK from npm:
npm install @modelcontextprotocol/sdk
Let's create a simple MCP server that exposes a calculator tool and some data:
import { McpServer, ResourceTemplate } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Create an MCP server
const server = new McpServer({
  name: "Demo",
  version: "1.0.0"
});

// Add an addition tool
server.tool("add",
  { a: z.number(), b: z.number() },
  async ({ a, b }) => ({
    content: [{ type: "text", text: String(a + b) }]
  })
);

// Add a dynamic greeting resource
server.resource(
  "greeting",
  new ResourceTemplate("greeting://{name}", { list: undefined }),
  async (uri, { name }) => ({
    contents: [{
      uri: uri.href,
      text: `Hello, ${name}!`
    }]
  })
);

// Start receiving messages on stdin and sending messages on stdout
const transport = new StdioServerTransport();
await server.connect(transport);
The Model Context Protocol (MCP) lets you build servers that expose data and functionality to LLM applications in a secure, standardized way. Think of it like a web API, but one designed specifically for LLM interactions. MCP servers can expose data through resources, provide functionality through tools, and define reusable interaction patterns through prompts.
The McpServer is your core interface to the MCP protocol. It handles connection management, protocol compliance, and message routing:
const server = new McpServer({
  name: "My App",
  version: "1.0.0"
});
Resources are how you expose data to LLMs. They're similar to GET endpoints in a REST API - they provide data but shouldn't perform significant computation or have side effects:
// Static resource
server.resource(
  "config",
  "config://app",
  async (uri) => ({
    contents: [{
      uri: uri.href,
      text: "App configuration here"
    }]
  })
);

// Dynamic resource with parameters
server.resource(
  "user-profile",
  new ResourceTemplate("users://{userId}/profile", { list: undefined }),
  async (uri, { userId }) => ({
    contents: [{
      uri: uri.href,
      text: `Profile data for user ${userId}`
    }]
  })
);
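On the client side, these resources are read by URI, using the same call shape as the client example near the end of this document. A minimal sketch, assuming `client` is an already-connected instance of the SDK's Client class and that user ID 123 is just an illustrative value:
// Hypothetical usage; `client` is assumed to be an already-connected Client
const profile = await client.readResource("users://123/profile");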
Tools let LLMs take actions through your server. Unlike resources, tools are expected to perform computation and have side effects:
// Simple tool with parameters
server.tool(
  "calculate-bmi",
  {
    weightKg: z.number(),
    heightM: z.number()
  },
  async ({ weightKg, heightM }) => ({
    content: [{
      type: "text",
      text: String(weightKg / (heightM * heightM))
    }]
  })
);

// Async tool with external API call
server.tool(
  "fetch-weather",
  { city: z.string() },
  async ({ city }) => {
    const response = await fetch(`https://api.weather.com/${city}`);
    const data = await response.text();
    return {
      content: [{ type: "text", text: data }]
    };
  }
);
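Clients invoke tools by name, passing the declared arguments, again using the call shape shown in the client example near the end of this document. A minimal sketch, assuming `client` is an already-connected Client; the argument values are illustrative:
// Hypothetical usage; `client` is assumed to be an already-connected Client
const bmi = await client.callTool("calculate-bmi", { weightKg: 70, heightM: 1.75 });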
Prompts are reusable templates that help LLMs interact with your server effectively:
server.prompt(
  "review-code",
  { code: z.string() },
  ({ code }) => ({
    messages: [{
      role: "user",
      content: {
        type: "text",
        text: `Please review this code:\n\n${code}`
      }
    }]
  })
);
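A client fetches a prompt by name and fills in its arguments, following the getPrompt shape shown in the client example later on. A minimal sketch, assuming `client` is an already-connected Client; the code string is illustrative:
// Hypothetical usage; `client` is assumed to be an already-connected Client
const review = await client.getPrompt("review-code", {
  code: "function add(a, b) { return a + b; }"
});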
A simple server demonstrating resources, tools, and prompts:
import { McpServer, ResourceTemplate } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({
  name: "Echo",
  version: "1.0.0"
});

server.resource(
  "echo",
  new ResourceTemplate("echo://{message}", { list: undefined }),
  async (uri, { message }) => ({
    contents: [{
      uri: uri.href,
      text: `Resource echo: ${message}`
    }]
  })
);

server.tool(
  "echo",
  { message: z.string() },
  async ({ message }) => ({
    content: [{ type: "text", text: `Tool echo: ${message}` }]
  })
);

server.prompt(
  "echo",
  { message: z.string() },
  ({ message }) => ({
    messages: [{
      role: "user",
      content: {
        type: "text",
        text: `Please process this message: ${message}`
      }
    }]
  })
);
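The echo example above only defines the server; it still needs a transport before clients can reach it. A minimal sketch, assuming the server should communicate over stdio as in the first example:
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

// Serve over stdin/stdout so an MCP client can spawn this process
const transport = new StdioServerTransport();
await server.connect(transport);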
A more complex example showing database integration:
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import sqlite3 from "sqlite3";
import { promisify } from "util";
import { z } from "zod";

const server = new McpServer({
  name: "SQLite Explorer",
  version: "1.0.0"
});

// Helper to create DB connection
const getDb = () => {
  const db = new sqlite3.Database("database.db");
  return {
    all: promisify<string, any[]>(db.all.bind(db)),
    close: promisify(db.close.bind(db))
  };
};

server.resource(
  "schema",
  "schema://main",
  async (uri) => {
    const db = getDb();
    try {
      const tables = await db.all(
        "SELECT sql FROM sqlite_master WHERE type='table'"
      );
      return {
        contents: [{
          uri: uri.href,
          text: tables.map((t: { sql: string }) => t.sql).join("\n")
        }]
      };
    } finally {
      await db.close();
    }
  }
);

server.tool(
  "query",
  { sql: z.string() },
  async ({ sql }) => {
    const db = getDb();
    try {
      const results = await db.all(sql);
      return {
        content: [{
          type: "text",
          text: JSON.stringify(results, null, 2)
        }]
      };
    } catch (err: unknown) {
      const error = err as Error;
      return {
        content: [{
          type: "text",
          text: `Error: ${error.message}`
        }],
        isError: true
      };
    } finally {
      await db.close();
    }
  }
);
For more control, you can use the low-level Server class directly:
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  ListPromptsRequestSchema,
  GetPromptRequestSchema
} from "@modelcontextprotocol/sdk/types.js";

const server = new Server(
  {
    name: "example-server",
    version: "1.0.0"
  },
  {
    capabilities: {
      prompts: {}
    }
  }
);

server.setRequestHandler(ListPromptsRequestSchema, async () => {
  return {
    prompts: [{
      name: "example-prompt",
      description: "An example prompt template",
      arguments: [{
        name: "arg1",
        description: "Example argument",
        required: true
      }]
    }]
  };
});

server.setRequestHandler(GetPromptRequestSchema, async (request) => {
  if (request.params.name !== "example-prompt") {
    throw new Error("Unknown prompt");
  }
  return {
    description: "Example prompt",
    messages: [{
      role: "user",
      content: {
        type: "text",
        text: "Example prompt text"
      }
    }]
  };
});

const transport = new StdioServerTransport();
await server.connect(transport);
The SDK provides a high-level client interface:
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["server.js"]
});

const client = new Client(
  {
    name: "example-client",
    version: "1.0.0"
  },
  {
    capabilities: {
      prompts: {},
      resources: {},
      tools: {}
    }
  }
);

await client.connect(transport);

// List prompts
const prompts = await client.listPrompts();

// Get a prompt
const prompt = await client.getPrompt("example-prompt", {
  arg1: "value"
});

// List resources
const resources = await client.listResources();

// Read a resource
const resource = await client.readResource("file:///example.txt");

// Call a tool
const result = await client.callTool("example-tool", {
  arg1: "value"
});
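As an end-to-end sketch, the same client API can drive the Echo server defined earlier. The filename echo-server.js is an assumption made here for illustration; the expected output follows from the echo tool's handler:
// Hypothetical end-to-end usage; "echo-server.js" is an assumed filename
const echoClient = new Client(
  { name: "echo-client", version: "1.0.0" },
  { capabilities: { prompts: {}, resources: {}, tools: {} } }
);
await echoClient.connect(new StdioClientTransport({
  command: "node",
  args: ["echo-server.js"]
}));

const echoed = await echoClient.callTool("echo", { message: "hello" });
// content: [{ type: "text", text: "Tool echo: hello" }]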
Issues and pull requests are welcome on GitHub at https://github.com/modelcontextprotocol/typescript-sdk.
This project is licensed under the MIT License—see the LICENSE file for details.
FAQs
Model Context Protocol implementation for TypeScript
The npm package @modelcontextprotocol/sdk receives 38,636 weekly downloads, which classifies it as a popular package.
@modelcontextprotocol/sdk shows a healthy release cadence and project activity: its latest version was released less than a year ago, and 3 open source maintainers collaborate on the project.